16 research outputs found

    Sensors for Robotic Hands: A Survey of State of the Art

    Recent decades have seen significant progress in the field of artificial hands. Most surveys that capture the latest developments in this field focus on the actuation and control systems of these devices. In this paper, our goal is to provide a comprehensive survey of the sensors for artificial hands. To present the evolution of the field, we cover five-year periods starting at the turn of the millennium. For each period, we present the robot hands with a focus on their sensor systems, dividing them into categories such as prosthetics, research devices, and industrial end-effectors. We also cover the sensors developed for robot hand usage in each era. Finally, the period between 2010 and 2015 introduces the reader to the state of the art and also hints at future directions in sensor development for artificial hands.

    Human grasping database for activities of daily living with depth, color and kinematic data streams

    This paper presents a grasping database collected from multiple human subjects performing activities of daily living in unstructured environments. The main strength of this database is its use of three different sensing modalities: color images from a head-mounted action camera, distance data from a depth sensor on the dominant arm, and upper-body kinematic data acquired from an inertial motion capture suit. 3826 grasps were identified in the data collected during 9 hours of experiments. The grasps were grouped according to a hierarchical taxonomy into 35 different grasp types. The database contains information related to each grasp and the associated sensor data acquired from the three sensing modalities. We also provide our data annotation software, written in Matlab, as an open-source tool. The size of the database is 172 GB. We believe this database can serve as a stepping stone for developing big data and machine learning techniques for grasping and manipulation, with potential applications in rehabilitation robotics and intelligent automation.
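
    As a minimal sketch of how grasp annotations from such a database might be indexed by taxonomy label, the Python snippet below groups records by grasp type. The file name and column names ("grasp_annotations.csv", "grasp_type", "subject", "t_start", "t_end") are assumptions for illustration only; the actual schema is defined by the authors' Matlab annotation tools.

    # Hypothetical sketch: grouping grasp annotations by taxonomy label.
    # The CSV file name and its columns are assumed for illustration and
    # are not the database's real schema.
    import csv
    from collections import defaultdict

    def index_grasps_by_type(annotation_csv):
        """Group grasp records by taxonomy label (35 types in the paper)."""
        by_type = defaultdict(list)
        with open(annotation_csv, newline="") as f:
            for row in csv.DictReader(f):
                by_type[row["grasp_type"]].append(
                    {"subject": row["subject"],
                     "t_start": float(row["t_start"]),
                     "t_end": float(row["t_end"])})
        return by_type

    if __name__ == "__main__":
        index = index_grasps_by_type("grasp_annotations.csv")
        for grasp_type, records in sorted(index.items()):
            print(f"{grasp_type}: {len(records)} grasps")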

    Locomotion Strategy Selection for a Hybrid Mobile Robot Using Time of Flight Depth Sensor

    The performance of a mobile robot can be improved by utilizing different locomotion modes in various terrain conditions. This creates the need for a supervisory controller capable of recognizing different terrain types and changing the locomotion mode of the robot accordingly. This work focuses on the locomotion strategy selection problem for a hybrid legged-wheeled mobile robot. Supervisory control of the robot is accomplished by a terrain recognizer, which classifies depth images obtained from a commercial time-of-flight depth sensor and selects different locomotion mode subcontrollers based on the recognized terrain type. For the terrain recognizer, a database is generated consisting of five terrain classes (Uneven, Level Ground, Stair Up, Stair Down, and Nontraversable). Depth images are enhanced using confidence-map-based filtering. The accuracy of terrain classification using a Support Vector Machine classifier on the testing database in the five-class terrain recognition problem is 97%. Real-world experiments assess the locomotion abilities of the quadruped and the capability of the terrain recognizer in real-time settings. The results of these experiments show that depth images processed in real time using machine learning algorithms can be used for the supervisory control of hybrid robots with legged and wheeled locomotion capabilities.
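
    As a rough sketch of the classification step described above, the following trains a five-class SVM on flattened depth images with scikit-learn. The random placeholder data, 64x64 image size, and RBF kernel are assumptions; the paper's confidence-map filtering and feature pipeline are not reproduced here.

    # Minimal sketch of five-class terrain recognition with an SVM.
    # Random placeholder data stands in for the real depth-image database.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    CLASSES = ["Uneven", "Level Ground", "Stair Up",
               "Stair Down", "Nontraversable"]

    rng = np.random.default_rng(0)
    X = rng.random((500, 64 * 64))          # 500 flattened "depth images"
    y = rng.integers(0, len(CLASSES), 500)  # terrain class indices

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=0)

    clf = SVC(kernel="rbf")  # kernel choice is an assumption
    clf.fit(X_train, y_train)
    print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")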

    Matlab Visualization Software

    Matlab software for visualizing the action camera, depth, and inertial motion data channels.

    10-minute sample record

    A sample record of around 10 minutes of experiments, containing the action camera, depth, and inertial motion data of the corresponding episode.

    First portion of depth stream data from human grasp dataset

    File containing the first portion of depth data acquired during grasping experiments. Editor's Note: The initial version of this file was found to be corrupted; the latest version corrects this problem.

    Third portion of depth stream data from human grasp dataset

    File containing the third portion of depth data acquired during grasping experiments.